
    A Report on Stochastic Fairness Queueing (SFQ) Experiments

    SRI International (SRI) has developed an improved queueing algorithm, known as Stochastic Fairness Queueing (SFQ), for best-effort traffic (i.e., traffic that does not require any guaranteed service). SFQ is a probabilistic variant of strict fair queueing: instead of a single queue being allocated per flow, a fixed number of queues are used, and a hash function maps the IP source and destination addresses to a particular queue. A seed to the hash function is also perturbed occasionally to help redistribute flows among the queues when more than one flow maps to the same queue during the lifetime of a flow. SFQ provides 'fair' access by trying to ensure that each flow from source to destination host obtains equal access to the available bandwidth. This report covers a series of experiments performed on DARTnet evaluating the behavior and performance of SFQ against a FIFO queueing discipline. These experiments were designed to show SFQ's advantages and performance, and include tests demonstrating: fair utilization of available resources; starvation prevention; graceful degradation under overload conditions; and resource usage. In general, the experiments do show that SFQ is better than FIFO queueing at allocating bandwidth equally among a set of flows. SFQ also prevents a single stream from dominating the available bandwidth, which seems to be a tendency with FIFO queueing (i.e., if a flow demands more than its share of the available bandwidth, with FIFO queueing that stream receives a disproportionate amount compared to flows demanding less than their share). Furthermore, SFQ seems to reward 'nice' users of the network by providing lower delay variance and higher throughput when their resource demand is less than their available share. Both SFQ and FIFO queueing seem to degrade fairly gracefully as the network becomes saturated and to recover well as the network becomes less congested.
Not unexpectedly, FIFO queueing is a little more efficient than SFQ: delays are lower and throughput is slightly higher, because SFQ requires more processing. However, the performance difference between the two queueing disciplines is relatively small. The experiments do point out some interesting behavior: FIFO queueing can behave better than SFQ with seed perturbation. We recommend further evaluation of the hash function and the seed perturbation technique; there are probably weaknesses in their current selection that cause this unexpected behavior. SFQ also seems to possess good scaling properties. To verify this, more experiments with a larger number of streams from more hosts need to be executed and examined, including the staggered introduction of streams. Staggering the streams may prove important, because graphs in the degradation experiment revealed some unexpected increases and decreases in throughput, which should be examined. This may again be due to the interaction of the hash function with the seed perturbation, but it may also be related to some other, unknown problem.
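The queue-selection step described above (hashing the source and destination to one of a fixed number of queues, with an occasionally perturbed seed) can be sketched as follows. This is a minimal illustration under stated assumptions, not SRI's implementation: the class name, queue count, and use of Python's built-in `hash` are all invented for the example, and the round-robin service of the queues is omitted.

```python
import random

NUM_QUEUES = 16  # fixed number of queues, typically far fewer than active flows


class SFQClassifier:
    """Sketch of SFQ's hash-based flow-to-queue mapping (hypothetical names)."""

    def __init__(self, num_queues=NUM_QUEUES):
        self.num_queues = num_queues
        self.seed = random.randrange(2**32)

    def queue_for(self, src_ip, dst_ip):
        # Hash the flow identifier together with the current seed, so the
        # same flow consistently maps to the same queue between perturbations.
        return hash((src_ip, dst_ip, self.seed)) % self.num_queues

    def perturb(self):
        # Occasionally changing the seed redistributes flows, so two flows
        # that collided on one queue are unlikely to stay collided forever.
        self.seed = random.randrange(2**32)
```

Two flows that happen to collide before a `perturb()` will, with high probability, land in different queues afterwards, which is the mechanism the report credits for limiting the damage of hash collisions.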

    The Allocation of European Union Allowances: Lessons, Unifying Themes and General Principles

    This paper is the concluding chapter of Rights, Rents and Fairness: Allocation in the European Emissions Trading Scheme, edited by the co-authors and forthcoming from Cambridge University Press. The main objective of this paper is to distill the lessons and general principles to be learnt from the allocation of allowances in the European Union Emission Trading Scheme (EU ETS), i.e. in the world's first experience with allocating carbon allowances to sub-national entities. We discuss the lessons that emerge from this experience and comment on what seem to be more general principles informing the allocation process and on the global implications of the EU ETS. As became obvious during the first allocation phase, the diversity of experience among the Member States is considerable, so these lessons and unifying themes are drawn from the experience of most of the Member States, not necessarily from all. Lessons and unifying observations are grouped in three categories: those concerning the conditions encountered, the processes employed, and the actual choices.
    Keywords: Climate Change, Emission Trading, Allocation, Fairness, EU Policy

    Over-allocation or abatement? : a preliminary analysis of the EU ETS based on the 2005 emission data

    This paper provides an initial analysis of the EU ETS based on the installation-level data for verified emissions and allowance allocations in the first trading year. Those data, released on May 15, 2006, and subsequent updates revealed that CO2 emissions were about 4% lower than the allocated allowances. The main objective of the paper is to shed light on the extent to which over-allocation and abatement took place in 2005. We propose a measure by which over-allocation can be judged and provide estimates of abatement based on emissions data and indicators of economic activity, as well as trends in energy and carbon intensity. Finally, we discuss the insights and implications that emerge from this tentative assessment.
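The headline "about 4% lower than allocated" comparison rests on simple installation-level arithmetic: allocated allowances minus verified emissions. The sketch below illustrates that arithmetic only; the function names and the round numbers are invented for illustration and are not the paper's proposed over-allocation measure, which additionally accounts for abatement.

```python
def allocation_surplus(allocated, verified):
    """Allowance surplus: positive when allocation exceeds verified emissions."""
    return allocated - verified


def surplus_share(allocated, verified):
    """Surplus expressed as a fraction of the allocated allowances."""
    return (allocated - verified) / allocated


# Hypothetical round numbers (not the paper's data), chosen so that
# verified emissions come in 4% below the allocation:
allocated_mt = 2000.0  # allocated allowances, Mt CO2
verified_mt = 1920.0   # verified emissions, Mt CO2
```

A positive surplus alone does not distinguish over-allocation from genuine abatement, which is exactly the ambiguity the paper sets out to resolve.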

    Over-Allocation or Abatement? A Preliminary Analysis of the EU Emissions Trading Scheme Based on the 2005 Emissions Data

    Abstract in HTML and technical report in PDF available on the Massachusetts Institute of Technology Joint Program on the Science and Policy of Global Change website (http://mit.edu/globalchange/www/). This paper provides an initial analysis of the European Union Emissions Trading Scheme (EU ETS) based on the installation-level data for verified emissions and allowance allocations in the first trading year. Those data, released on May 15, 2006, and subsequent updates revealed that CO2 emissions were about 4% lower than the allocated allowances. The main objective of the paper is to shed light on the extent to which over-allocation and abatement have taken place in 2005. We propose a measure by which over-allocation can be judged and provide estimates of abatement based on emissions data and indicators of economic activity as well as trends in energy and carbon intensity. Finally, we discuss the insights and implications that emerge from this tentative assessment. This study received partial support from the MIT Joint Program on the Science and Policy of Global Change, which is supported by a consortium of government, industry and foundation sponsors.

    The Allocation of European Union Allowances: Lessons, Unifying Themes and General Principles

    Abstract in HTML and technical report in PDF available on the Massachusetts Institute of Technology Joint Program on the Science and Policy of Global Change website (http://mit.edu/globalchange/www/). A critical issue in dealing with climate change is deciding who has a right to emit carbon dioxide (CO2), and under what conditions, when those emissions are limited. The European Union Emissions Trading Scheme (EU ETS) is the world's first large experiment with an emission trading system for CO2 and it is likely to be copied by others if there is to be a global regime for limiting greenhouse gas emissions. This paper provides the first in-depth description and analysis of the process by which rights to emit carbon dioxide were created and distributed in the EU ETS. The main objective of the paper is to distill the lessons and general principles to be learned from the allocation of allowances in the EU ETS, i.e. in the world's first experience with allocating carbon allowances to sub-national entities. We discuss the lessons and unifying observations that emerge from this experience and provide some insights on what seem to be more general principles informing the allocation process and on the global implications of the EU ETS. We are indebted to the European Commission, FEEM, and MIT's Center for Energy and Environmental Policy Research and Joint Program on the Science and Policy of Global Change for the various forms of financial and other support that made possible the book on which this article is based.

    Regular Topologies for Gigabit Wide-Area Networks: Congestion Avoidance Testbed Experiments

    This document is Volume 3 of the final technical report on the work performed by SRI International (SRI) on SRI Project 8600. The document includes source listings for all software developed by SRI under this effort. Since some of our work involved the use of ST-II and the Sun Microsystems, Inc. (Sun) High-Speed Serial Interface (HSI/S) driver, we have included some of the source developed by LBL and BBN as well. In most cases, our decision to include source developed by other contractors depended on whether it was necessary to modify the original code. If we modified the software in any way, it is included in this document. In the case of the Traffic Generator (TG), however, we have included all the ST-II software, even though BBN performed the integration, because the ST-II software is part of the standard TG release. It is important to note that all the code developed by other contractors is in the public domain, so all software developed under this effort can be re-created from the source included here.

    Congestion Avoidance Testbed Experiments

    DARTnet provides an excellent environment for executing networking experiments. Since the network is private and spans the continental United States, it gives researchers a great opportunity to test network behavior under controlled conditions. However, this opportunity is not available very often, and therefore a support environment for such testing is lacking. To help remedy this situation, part of SRI's effort in this project was devoted to advancing the state of the art in the techniques used for benchmarking network performance. The second objective of SRI's effort was to advance networking technology in the area of traffic control, and to test our ideas on DARTnet using the benchmarking tools we developed. Networks are becoming more common and are being used by more and more people, and applications such as multimedia conferencing and distributed simulations are placing greater demands on the resources networks provide. Hence, new traffic-control mechanisms must be created to enable networks to serve the needs of their users. SRI's objective, therefore, was to investigate a new queueing and scheduling approach that will help to meet the needs of a large, diverse user population in a "fair" way.

    Regular Topologies for Gigabit Wide-Area Networks

    Get PDF
    In general terms, this project aimed at the analysis and design of techniques for very high-speed networking. The formal objectives of the project were to: (1) identify switch and network technologies for wide-area networks that interconnect a large number of users and can provide individual data paths at gigabit/s rates; (2) quantitatively evaluate and compare existing and proposed architectures and protocols, identify their strengths and growth potential, and ascertain the compatibility of competing technologies; and (3) propose new approaches to existing architectures and protocols, and identify opportunities for research to overcome deficiencies and enhance performance. The project was organized into two parts: 1. The design, analysis, and specification of techniques and protocols for very-high-speed network environments. In this part, SRI focused on several key high-speed networking areas, including Forward Error Control (FEC) for high-speed networks in which data distortion is the result of packet loss, and the distribution of broadband, real-time traffic in multiple user sessions. 2. The Congestion Avoidance Testbed Experiment (CATE). This part of the project was done within the framework of the DARTnet experimental T1 national network. The aim of the work was to advance the state of the art in benchmarking DARTnet's performance and traffic control by developing support tools for network experimentation, by designing benchmarks that allow various algorithms to be meaningfully compared, and by investigating new queueing techniques that better satisfy the needs of best-effort and reserved-resource traffic. This document is the final technical report describing the results obtained by SRI under this project. The report consists of three volumes: Volume 1 contains a technical description of the network techniques developed by SRI in the areas of FEC and multicast of real-time traffic. Volume 2 describes the work performed under CATE. Volume 3 contains the source code of all software developed under CATE.
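The FEC setting mentioned above, where distortion takes the form of whole lost packets rather than bit errors, is commonly illustrated with a single XOR parity packet: any one lost packet in a group can be reconstructed from the survivors. The sketch below shows that textbook scheme; it is an illustration of the general idea only, and the function names are invented, not taken from SRI's Volume 1 techniques.

```python
def xor_parity(packets):
    """Byte-wise XOR of equal-length packets, producing one parity packet."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, byte in enumerate(pkt):
            parity[i] ^= byte
    return bytes(parity)


def recover_missing(survivors, parity):
    """Reconstruct the single missing packet: XOR of survivors and parity.

    Works because XOR-ing the parity with all but one of the original
    packets cancels everything except the missing packet.
    """
    return xor_parity(list(survivors) + [parity])
```

One parity packet per group tolerates exactly one loss; tolerating more losses per group requires stronger codes (e.g., Reed-Solomon), at the cost of more redundancy and processing.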

    Qualitative study of system-level factors related to genomic implementation

    PURPOSE: Research on genomic medicine integration has focused on applications at the individual level, with less attention paid to implementation within clinical settings. Therefore, we conducted a qualitative study using the Consolidated Framework for Implementation Research (CFIR) to identify system-level factors that played a role in implementation of genomic medicine within Implementing GeNomics In PracTicE (IGNITE) Network projects. METHODS: Up to four study personnel, including principal investigators and study coordinators from each of six IGNITE projects, were interviewed using a semistructured interview guide that asked interviewees to describe study site(s), progress at each site, and factors facilitating or impeding project implementation. Interviews were coded following CFIR inner-setting constructs. RESULTS: Key barriers included (1) limitations in integrating genomic data and clinical decision support tools into electronic health records, (2) physician reluctance toward genomic research participation and clinical implementation due to a limited evidence base, (3) inadequate reimbursement for genomic medicine, (4) communication among and between investigators and clinicians, and (5) lack of clinical and leadership engagement. CONCLUSION: Implementation of genomic medicine is hindered by several system-level barriers to both research and practice. Addressing these barriers may serve as important facilitators for studying and implementing genomics in practice.

    The content of attenuated psychotic symptoms in those at clinical high risk for psychosis

    Recent research has started to focus on identifying individuals who are at clinical high risk of developing psychosis as a means to try to understand the predictors and mechanisms involved in the progression to a full psychotic episode. The aim of the current study was to provide an initial description and prevalence rates of specific content found within attenuated positive symptoms. The Content of Attenuated Positive Symptoms (CAPS) codebook was used by independent raters to determine the presence of content within a sample of written vignettes. Krippendorff's alpha was used to determine inter-rater reliability. Overall, the majority of items fell in or above an acceptable range of reliability. There was heterogeneity in the types of content endorsed. However, the most commonly endorsed items included being perplexed by reality, increased hypervigilance, being gifted, hearing indistinct and distinct sounds, seeing figures or shadows, something touching the individual, and unpleasant smells. The use of the CAPS codebook is a reliable way to code the content of attenuated positive symptoms. Identifying and monitoring the presence of certain content may provide insight into the presence of other comorbid issues and the potential for future conversion.